On Gaussian Radial Basis Function Approximations: Interpretation, Extensions, and Learning Strategies

Author

  • Mário A. T. Figueiredo

Abstract

In this paper we focus on an interpretation of Gaussian radial basis functions (GRBF) which motivates extensions and learning strategies. Specifically, we show that GRBF regression equations naturally result from representing the input-output joint probability density function by a finite mixture of Gaussians. Corollaries of this interpretation are: some special forms of GRBF representations can be traced back to the type of Gaussian mixture used; previously proposed learning methods based on input-output clustering have a new meaning; finally, estimation techniques for finite mixtures (namely the EM algorithm, and model selection criteria) can be invoked to learn GRBF regression equations.
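To make the interpretation above concrete, the following is a minimal sketch (in Python, with illustrative names, and with scikit-learn's GaussianMixture standing in for any EM implementation) of the construction the abstract describes: a finite Gaussian mixture is fitted to the joint (x, y) samples by EM, and the regression function E[y | x] read off from it is a normalized Gaussian-RBF combination of per-component predictors. This is an assumption-laden illustration, not the paper's own code.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) plus noise (illustrative assumption).
x = rng.uniform(-3.0, 3.0, size=400)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# EM fit of a finite Gaussian mixture to the joint density p(x, y).
M = 8
gmm = GaussianMixture(n_components=M, covariance_type="full", random_state=0)
gmm.fit(np.column_stack([x, y]))

def grbf_predict(x_new):
    """Conditional mean E[y | x] of the fitted joint mixture.

    Each component contributes a Gaussian basis function in x; the prediction
    is their responsibility-weighted (normalized RBF) combination.
    """
    x_new = np.atleast_1d(np.asarray(x_new, dtype=float))
    num = np.zeros_like(x_new)
    den = np.zeros_like(x_new)
    for m in range(M):
        mx, my = gmm.means_[m]
        (sxx, sxy), (_, _) = gmm.covariances_[m]
        # Gaussian radial basis function in the input variable.
        phi = gmm.weights_[m] * np.exp(-0.5 * (x_new - mx) ** 2 / sxx) / np.sqrt(2 * np.pi * sxx)
        # Per-component (linear) conditional mean of y given x.
        cond_mean = my + (sxy / sxx) * (x_new - mx)
        num += phi * cond_mean
        den += phi
    return num / den

print(grbf_predict([0.0, 1.5]))  # should land near sin(0) = 0 and sin(1.5)

If the mixture components are constrained so that the input-output cross-covariance vanishes (sxy = 0 above), each per-component conditional mean reduces to a constant and the predictor becomes a classical normalized GRBF expansion, which is one way special forms of GRBF representations can be traced back to the type of Gaussian mixture used.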


Similar articles

Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection

In this study, two novel learning algorithms are applied to a Radial Basis Function Neural Network (RBFNN) to approximate highly nonlinear functions. Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly reduce the error functions. The main idea concerns the various strategies used to optimize the procedure of Gradient ...


Stable Gaussian radial basis function method for solving Helmholtz equations

Radial basis functions (RBFs) are a powerful tool for approximating the solution of high-dimensional problems. They are often referred to as a meshfree method and can be spectrally accurate. In this paper, we analyze a new stable method for evaluating Gaussian radial basis function interpolants based on the eigenfunction expansion. We develop our approach in two-dimensional spaces for so...

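For readers unfamiliar with the setting of the snippet above, here is a minimal sketch of the direct (textbook) way to build and evaluate a Gaussian RBF interpolant s(x) = sum_j c_j exp(-eps^2 ||x - x_j||^2). This direct formulation is exactly what becomes ill-conditioned for small shape parameters; it is not the stable eigenfunction-expansion method that the cited paper proposes. The point sets, test function, and all names are illustrative assumptions.

import numpy as np

def gaussian_kernel(a, b, eps):
    # Pairwise Gaussian kernel matrix between point sets a (n, d) and b (m, d).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-(eps ** 2) * d2)

rng = np.random.default_rng(1)
nodes = rng.uniform(-1.0, 1.0, size=(50, 2))      # 2-D interpolation nodes
f = np.sin(np.pi * nodes[:, 0]) * nodes[:, 1]     # sampled target values

eps = 2.0                                         # shape parameter
coeffs = np.linalg.solve(gaussian_kernel(nodes, nodes, eps), f)

# Evaluate the interpolant at a few new points.
x_eval = rng.uniform(-1.0, 1.0, size=(5, 2))
print(gaussian_kernel(x_eval, nodes, eps) @ coeffs)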

On the construction and training of reformulated radial basis function neural networks

This paper presents a systematic approach for constructing reformulated radial basis function (RBF) neural networks, which was developed to facilitate their training by supervised learning algorithms based on gradient descent. This approach reduces the construction of radial basis function models to the selection of admissible generator functions. The selection of generator functions relies on the concept...


Faster Gaussian Summation: Theory and Experiment

We provide faster algorithms for the problem of Gaussian summation, which occurs in many machine learning methods. We develop two new extensions: an O(D) Taylor expansion for the Gaussian kernel with rigorous error bounds, and a new error control scheme integrating any arbitrary approximation method within the best discrete algorithmic framework using adaptive hierarchical data structures. We rigo...
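As context for the snippet above, the underlying Gaussian summation problem is, in its naive form, computing g(q_i) = sum_j w_j exp(-||q_i - r_j||^2 / (2 h^2)) for every query point, at a cost proportional to the number of queries times the number of reference points; the cited work is about accelerating this. Only the naive baseline is sketched below, with illustrative names throughout.

import numpy as np

def naive_gaussian_summation(queries, references, weights, h):
    # Direct O(N_q * N_r) evaluation of the weighted Gaussian kernel sums.
    d2 = ((queries[:, None, :] - references[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * h ** 2)) @ weights

rng = np.random.default_rng(2)
refs = rng.standard_normal((1000, 3))
qrys = rng.standard_normal((5, 3))
print(naive_gaussian_summation(qrys, refs, np.ones(len(refs)), h=0.5))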


Nonconvex Optimization Using a Fokker-Planck Learning Machine

A new algorithm for nonconvex optimization by means of a so-called Fokker-Planck Learning Machine is proposed in this paper. This is done by considering the Fokker-Planck (FP) equation related to continuous simulated annealing, which has been proven to converge to the global optimum under certain conditions. An approximate solution to the FP equation is sought by parametrizing the transition...



Journal:

Volume   Issue

Pages  -

Publication date: 2000